Linear Algebra
1.I.1H
Part IB, 2004. Suppose that is a linearly independent set of distinct elements of a vector space and spans . Prove that may be reordered, as necessary, so that spans .
Suppose that is a linearly independent set of distinct elements of and that spans . Show that .
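For reference, the underlying exchange result can be stated as follows; the symbols are generic and are not necessarily those of the original paper.
$$\text{If } \{e_1,\dots,e_n\}\subseteq V \text{ is linearly independent and } \{f_1,\dots,f_m\} \text{ spans } V, \text{ then } n\le m,$$
and after reordering the $f_j$ the set $\{e_1,\dots,e_n,f_{n+1},\dots,f_m\}$ still spans $V$.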
1.II.12H
Part IB, 2004. Let and be subspaces of the finite-dimensional vector space . Prove that both the sum and the intersection are subspaces of . Prove further that
Let be the kernels of the maps given by the matrices and respectively, where
Find a basis for the intersection , and extend this first to a basis of , and then to a basis of .
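For reference, the standard dimension formula for two subspaces of a finite-dimensional space, in generic notation:
$$\dim(U+W)+\dim(U\cap W)=\dim U+\dim W.$$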
2.I.1E
Part IB, 2004. For each let be the matrix defined by
What is ? Justify your answer.
[It may be helpful to look at the cases before tackling the general case.]
2.II.12E
Part IB, 2004. Let be a quadratic form on a real vector space of dimension . Prove that there is a basis with respect to which is given by the formula
Prove that the numbers and are uniquely determined by the form . By means of an example, show that the subspaces and need not be uniquely determined by .
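A standard shape for the diagonalised formula, written in generic coordinates $x_1,\dots,x_n$ with respect to the basis found ($p$ and $q$ below are assumed names for the two counts):
$$Q(x)=x_1^2+\cdots+x_p^2-x_{p+1}^2-\cdots-x_{p+q}^2,$$
and Sylvester's law of inertia asserts that $p$ and $q$ are determined by the form alone.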
3.I.1E
Part IB, 2004. Let be a finite-dimensional vector space over . What is the dual space of ? Prove that the dimension of the dual space is the same as that of .
3.II.13E
Part IB, 2004. (i) Let be an -dimensional vector space over and let be an endomorphism. Suppose that the characteristic polynomial of is , where the are distinct and for every .
Describe all possibilities for the minimal polynomial and prove that there are no further ones.
(ii) Give an example of a matrix for which both the characteristic and the minimal polynomial are .
(iii) Give an example of two matrices with the same rank and the same minimal and characteristic polynomials such that there is no invertible matrix with .
4.I.1E
Part IB, 2004. Let be a real -dimensional inner-product space and let be a dimensional subspace. Let be an orthonormal basis for . In terms of this basis, give a formula for the orthogonal projection .
Let . Prove that is the closest point in to .
[You may assume that the sequence can be extended to an orthonormal basis of .]
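A standard formula for the orthogonal projection onto a subspace with orthonormal basis $e_1,\dots,e_k$, in generic notation:
$$\pi(v)=\sum_{i=1}^{k}\langle v,e_i\rangle\,e_i,$$
and the closest-point claim amounts to $\|v-\pi(v)\|\le\|v-w\|$ for every $w$ in the subspace.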
4.II.11E
Part IB, 2004. (i) Let be an -dimensional inner-product space over and let be a Hermitian linear map. Prove that has an orthonormal basis consisting of eigenvectors of .
(ii) Let be another Hermitian map. Prove that is Hermitian if and only if .
(iii) A Hermitian map is positive-definite if for every non-zero vector . If is a positive-definite Hermitian map, prove that there is a unique positive-definite Hermitian map such that .
1.I.1C
Part IB, 2005. Let be an -dimensional vector space over , and let be a linear map. Define the minimal polynomial of . Prove that is invertible if and only if the constant term of the minimal polynomial of is non-zero.
1.II.9C
Part IB, 2005. Let be a finite dimensional vector space over , and be the dual space of .
If is a subspace of , we define the subspace of by
Prove that . Deduce that, if is any real -matrix of rank , the equations
have linearly independent solutions in .
2.I.1C
Part IB, 2005. Let be the set of all matrices of the form , where are in , and
Prove that is closed under multiplication and determine its dimension as a vector space over . Prove that
and deduce that each non-zero element of is invertible.
2.II.10C
Part IB, 2005. (i) Let be an matrix with entries in C. Define the determinant of , the cofactor of each , and the adjugate matrix . Assuming the expansion of the determinant of a matrix in terms of its cofactors, prove that
where is the identity matrix.
(ii) Let
Show the eigenvalues of are , where , and determine the diagonal matrix to which is similar. For each eigenvalue, determine a non-zero eigenvector.
3.II.10B
Part IB, 2005. Let be the vector space of functions such that the th derivative of is defined and continuous for every . Define linear maps by and . Show that
where in this question means and is the identity map on .
Now let be any real vector space with linear maps such that . Suppose that there is a nonzero element with . Let be the subspace of spanned by , and so on. Show that is in and give a formula for it. More generally, show that is in for each , and give a formula for it.
Show, using your formula or otherwise, that are linearly independent. (Or, equivalently: show that are linearly independent for every .)
4.I.1B
Part IB, 2005. Define what it means for an complex matrix to be unitary or Hermitian. Show that every eigenvalue of a Hermitian matrix is real. Show that every eigenvalue of a unitary matrix has absolute value 1.
Show that two eigenvectors of a Hermitian matrix that correspond to different eigenvalues are orthogonal, using the standard inner product on .
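A minimal sketch of the eigenvalue claims, in generic notation and assuming the standard inner product $\langle x,y\rangle=x^{*}y$: if $A^{*}=A$ and $Ax=\lambda x$ with $x\neq 0$, then
$$\lambda\,x^{*}x=x^{*}Ax=(Ax)^{*}x=\bar{\lambda}\,x^{*}x,$$
so $\lambda\in\mathbb{R}$; and if $U$ is unitary with $Ux=\lambda x$, then $\|x\|^{2}=\|Ux\|^{2}=|\lambda|^{2}\|x\|^{2}$, so $|\lambda|=1$.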
4.II.10B
Part IB, 2005. (i) Let be a finite-dimensional real vector space with an inner product. Let be a basis for . Prove by an explicit construction that there is an orthonormal basis for such that the span of is equal to the span of for every .
(ii) For any real number , consider the quadratic form
on . For which values of is nondegenerate? When is nondegenerate, compute its signature in terms of .
1.I.1H
Part IB, 2006. Define what is meant by the minimal polynomial of a complex matrix, and show that it is unique. Deduce that the minimal polynomial of a real matrix has real coefficients.
For , find an matrix with minimal polynomial .
1.II.9H
Part IB, 2006. Let be finite-dimensional vector spaces, and let be a linear map of into . Define the rank and the nullity of , and prove that
Now let be endomorphisms of a vector space . Define the endomorphisms and , and prove that
Prove that equality holds in both inequalities if and only if is an isomorphism and is zero.
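For reference, the rank–nullity identity requested in the first part, in generic notation for a linear map $\alpha\colon U\to V$ with $U$ finite-dimensional:
$$\operatorname{rank}(\alpha)+\operatorname{null}(\alpha)=\dim U.$$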
2.I.1E
Part IB, 2006. State Sylvester's law of inertia.
Find the rank and signature of the quadratic form on given by
2.II.10E
Part IB, 2006. Suppose that is the set of complex polynomials of degree at most in the variable . Find the dimension of as a complex vector space.
Define
Find a subset of that is a basis of the dual vector space . Find the corresponding dual basis of .
Define
Write down the matrix of with respect to the basis of that you have just found, and the matrix of the map dual to with respect to the dual basis.
3.II.10H
Part IB, 2006. (a) Define what is meant by the trace of a complex matrix . If denotes an invertible matrix, show that and have the same trace.
(b) If are distinct non-zero complex numbers, show that the endomorphism of defined by the matrix
has trivial kernel, and hence that the same is true for the transposed matrix .
For arbitrary complex numbers , show that the vector is not in the kernel of the endomorphism of defined by the matrix
unless all the are zero.
[Hint: reduce to the case when are distinct non-zero complex numbers, with , and each for is either zero or equal to some with . If the kernel of the endomorphism contains , show that it also contains a vector of the form with the strictly positive integers.]
(c) Assuming the fact that any complex matrix is conjugate to an upper-triangular one, prove that if is an matrix such that has zero trace for all , then
4.I.1H
Part IB, 2006. Suppose is a vector space over a field . A finite set of vectors is said to be a basis for if it is both linearly independent and spanning. Prove that any two finite bases for have the same number of elements.
4.II.10E
Part IB, 2006. Suppose that is an orthogonal endomorphism of the finite-dimensional real inner product space . Suppose that is decomposed as a direct sum of mutually orthogonal -invariant subspaces. How small can these subspaces be made, and how does act on them? Justify your answer.
Describe the possible matrices for with respect to a suitably chosen orthonormal basis of when .
1.I.1G
Part IB, 2007. Suppose that is a basis of the complex vector space and that is the linear operator defined by , and .
By considering the action of on column vectors of the form , where , or otherwise, find the diagonalization of and its characteristic polynomial.
1.II.9G
Part IB, 2007. State and prove Sylvester's law of inertia for a real quadratic form.
[You may assume that for each real symmetric matrix A there is an orthogonal matrix , such that is diagonal.]
Suppose that is a real vector space of even dimension , that is a non-singular quadratic form on and that is an -dimensional subspace of on which vanishes. What is the signature of ?
2.I.1G
Part IB, 2007. Suppose that are endomorphisms of the 3-dimensional complex vector space and that the eigenvalues of each of them are . What are their characteristic and minimal polynomials? Are they conjugate?
2.II.10G
Part IB, 2007. Suppose that is the complex vector space of complex polynomials in one variable, .
(i) Show that the form , defined by
is a positive definite Hermitian form on .
(ii) Find an orthonormal basis of for this form, in terms of the powers of .
(iii) Generalize this construction to complex vector spaces of complex polynomials in any finite number of variables.
3.II.10G
Part IB, 2007. (i) Define the terms row-rank, column-rank and rank of a matrix, and state a relation between them.
(ii) Fix positive integers with . Suppose that is an matrix and a matrix. State and prove the best possible upper bound on the rank of the product .
4.I.1G
Part IB, 2007. Suppose that is a linear map of finite-dimensional complex vector spaces. What is the dual map of the dual vector spaces?
Suppose that we choose bases of and take the corresponding dual bases of the dual vector spaces. What is the relation between the matrices that represent and with respect to these bases? Justify your answer.
4.II.10G
Part IB, 2007. (i) State and prove the Cayley-Hamilton theorem for square complex matrices.
(ii) A square matrix is of order for a strictly positive integer if and no smaller positive power of is equal to .
Determine the order of a complex matrix of trace zero and determinant 1.
1.I.1E
Part IB, 2008. Let be an matrix over . What does it mean to say that is an eigenvalue of ? Show that has at least one eigenvalue. For each of the following statements, provide a proof or a counterexample as appropriate.
(i) If is Hermitian, all eigenvalues of are real.
(ii) If all eigenvalues of are real, is Hermitian.
(iii) If all entries of are real and positive, all eigenvalues of have positive real part.
(iv) If and have the same trace and determinant then they have the same eigenvalues.
1.II.9E
Part IB, 2008. Let be an matrix of real numbers. Define the row rank and column rank of and show that they are equal.
Show that if a matrix is obtained from by elementary row and column operations then .
Let and be matrices. Show that the matrices and have the same rank.
Hence, or otherwise, prove that
2.I.1E
Part IB, 2008. Suppose that and are finite-dimensional vector spaces over . What does it mean to say that is a linear map? State the rank-nullity formula. Using it, or otherwise, prove that a linear map is surjective if, and only if, it is injective.
Suppose that is a linear map which has a right inverse, that is to say there is a linear map such that , the identity map. Show that .
Suppose that and are two matrices over such that . Prove that .
2.II.10E
Part IB, 2008. Define the determinant of an square matrix over the complex numbers. If and are two such matrices, show that .
Write for the characteristic polynomial of a matrix . Let be matrices and suppose that is nonsingular. Show that . Taking for appropriate values of , or otherwise, deduce that .
Show that if then . Which of the following statements is true for all matrices ? Justify your answers.
(i) ;
(ii) .
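The fact deduced in the middle part can be recorded, in generic notation for $n\times n$ complex matrices $A$ and $B$, as
$$\chi_{AB}(t)=\chi_{BA}(t),$$
i.e. $AB$ and $BA$ have the same characteristic polynomial even when neither matrix is invertible.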
3.II.10E
Part IB, 2008. Let or . What is meant by a quadratic form ? Show that there is a basis for such that, writing , we have for some scalars
Suppose that . Define the rank and signature of and compute these quantities for the form given by .
Suppose now that and that are quadratic forms. If , show that there is some nonzero such that .
4.I.1E
Part IB, 2008. Describe (without proof) what it means to put an matrix of complex numbers into Jordan normal form. Explain (without proof) the sense in which the Jordan normal form is unique.
Put the following matrix in Jordan normal form:
4.II.10E
Part IB, 2008. What is meant by a Hermitian matrix? Show that if is Hermitian then all its eigenvalues are real and that there is an orthonormal basis for consisting of eigenvectors of .
A Hermitian matrix is said to be positive definite if for all . We write in this case. Show that is positive definite if, and only if, all of its eigenvalues are positive. Show that if then has a unique positive definite square root .
Let be two positive definite Hermitian matrices with . Writing and , show that . By considering eigenvalues of , or otherwise, show that .
Paper 4, Section I, G
Part IB, 2009. Show that every endomorphism of a finite-dimensional vector space satisfies some polynomial, and define the minimal polynomial of such an endomorphism.
Give a linear transformation of an eight-dimensional complex vector space which has minimal polynomial .
Paper 1, Section I, G
Part IB, 2009. (1) Let be a finite-dimensional vector space and let be a non-zero endomorphism of . If show that the dimension of is an even integer. Find the minimal polynomial of . [You may assume the rank-nullity theorem.]
(2) Let , be non-zero subspaces of a vector space with the property that
Show that there is a 2-dimensional subspace for which all the are one-dimensional.
Paper 2, Section I,
Part IB, 2009. Let denote the vector space of polynomials in two variables of total degree at most . Find the dimension of .
If is defined by
find the kernel of and the image of . Compute the trace of for each with .
Paper 1, Section II, G
Part IB, 2009. Define the dual of a vector space . State and prove a formula for its dimension.
Let be the vector space of real polynomials of degree at most . If are distinct real numbers, prove that there are unique real numbers with
for every .
Paper 3, Section II, G
Part IB, 2009. For each of the following, provide a proof or counterexample.
(1) If are complex matrices and , then and have a common eigenvector.
(2) If are complex matrices and , then and have a common eigenvalue.
(3) If are complex matrices and then .
(4) If is an endomorphism of a finite-dimensional vector space and is an eigenvalue of , then the dimension of equals the multiplicity of as a root of the minimal polynomial of .
(5) If is an endomorphism of a finite-dimensional complex vector space , is an eigenvalue of , and , then where is the multiplicity of as a root of the minimal polynomial of .
Paper 4, Section II, G
Part IB, 2009. What does it mean to say two real symmetric bilinear forms and on a vector space are congruent?
State and prove Sylvester's law of inertia, and deduce that the rank and signature determine the congruence class of a real symmetric bilinear form. [You may use without proof a result on diagonalisability of real symmetric matrices, provided it is clearly stated.]
How many congruence classes of symmetric bilinear forms on a real -dimensional vector space are there? Such a form defines a family of subsets , for . For how many of the congruence classes are these associated subsets all bounded subsets of ? Is the quadric surface
a bounded or unbounded subset of ? Justify your answers.
Paper 2, Section II, G
Part IB, 2009. Let be a finite-dimensional vector space and let be an endomorphism of . Show that there is a positive integer such that . Hence, or otherwise, show that if has zero determinant there is some non-zero endomorphism with .
Suppose and are endomorphisms of for which . Show that is similar to if and only if they have the same rank.
Paper 1, Section I, F
Part IB, 2010. Suppose that is the complex vector space of polynomials of degree at most in the variable . Find the Jordan normal form for each of the linear transformations and acting on .
Paper 2, Section I, F
Part IB, 2010. Suppose that is an endomorphism of a finite-dimensional complex vector space.
(i) Show that if is an eigenvalue of , then is an eigenvalue of .
(ii) Show conversely that if is an eigenvalue of , then there is an eigenvalue of with .
Paper 4, Section I, F
Part IB, 2010. Define the notion of an inner product on a finite-dimensional real vector space , and the notion of a self-adjoint linear map .
Suppose that is the space of real polynomials of degree at most in a variable . Show that
is an inner product on , and that the map :
is self-adjoint.
Paper 1, Section II, F
Part IB, 2010. Let denote the vector space of real matrices.
(1) Show that if , then is a positive-definite symmetric bilinear form on .
(2) Show that if , then is a quadratic form on . Find its rank and signature.
[Hint: Consider symmetric and skew-symmetric matrices.]
Paper 2, Section II, F
Part IB, 2010. (i) Show that two complex matrices are similar (i.e. there exists invertible with ) if and only if they represent the same linear map with respect to different bases.
(ii) Explain the notion of Jordan normal form of a square complex matrix.
(iii) Show that any square complex matrix is similar to its transpose.
(iv) If is invertible, describe the Jordan normal form of in terms of that of .
Justify your answers.
Paper 3, Section II, F
Part IB, 2010. Suppose that is a finite-dimensional vector space over , and that is a -linear map such that for some . Show that if is a subspace of such that , then there is a subspace of such that and .
[Hint: Show, for example by picking bases, that there is a linear map with for all . Then consider with
Paper 4, Section II, F
Part IB, 2010. (i) Show that the group of orthogonal real matrices has a normal subgroup .
(ii) Show that if and only if is odd.
(iii) Show that if is even, then is not the direct product of with any normal subgroup.
[You may assume that the only elements of that commute with all elements of are .]
Paper 1, Section I, G
Part IB, 2011. (i) State the rank-nullity theorem for a linear map between finite-dimensional vector spaces.
(ii) Show that a linear transformation of a finite-dimensional vector space is bijective if it is injective or surjective.
(iii) Let be the -vector space of all polynomials in with coefficients in . Give an example of a linear transformation which is surjective but not bijective.
Paper 2, Section , G
Part IB, 2011. Let be an -dimensional -vector space with an inner product. Let be an -dimensional subspace of and its orthogonal complement, so that every element can be uniquely written as for and .
The reflection map with respect to is defined as the linear map
Show that is an orthogonal transformation with respect to the inner product, and find its determinant.
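A minimal description of the reflection, in generic notation and assuming the usual convention that it fixes the subspace $W$ pointwise: writing $v=w+w'$ with $w\in W$ and $w'\in W^{\perp}$,
$$R(v)=w-w',$$
so $R$ preserves the inner product and $\det R=(-1)^{\dim W^{\perp}}$.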
Paper 4, Section I, G
Part IB, 2011. (i) Let be a vector space over a field , and subspaces of . Define the subset of , and show that and are subspaces of .
(ii) When are finite-dimensional, state a formula for in terms of and .
(iii) Let be the -vector space of all matrices over . Let be the subspace of all symmetric matrices and the subspace of all upper triangular matrices (the matrices such that whenever . Find and . Briefly justify your answer.
Paper 1, Section II, G
Part IB, 2011. Let be finite-dimensional vector spaces over a field and a linear map.
(i) Show that is injective if and only if the image of every linearly independent subset of is linearly independent in .
(ii) Define the dual space of and the dual map .
(iii) Show that is surjective if and only if the image under of every linearly independent subset of is linearly independent in .
Paper 2, Section II, G
Part IB, 2011. Let be a positive integer, and let be a -vector space of complex-valued functions on , generated by the set .
(i) Let for . Show that this is a positive definite Hermitian form on .
(ii) Let . Show that is a self-adjoint linear transformation of with respect to the form defined in (i).
(iii) Find an orthonormal basis of with respect to the form defined in (i), which consists of eigenvectors of .
Paper 3, Section II, G
Part IB, 2011. (i) Let be an complex matrix and a polynomial with complex coefficients. By considering the Jordan normal form of or otherwise, show that if the eigenvalues of are then the eigenvalues of are .
(ii) Let . Write as for a polynomial with , and find the eigenvalues of
[Hint: compute the powers of .]
Paper 4, Section II, G
Part IB, 2011. Let be an -dimensional -vector space and linear transformations. Suppose is invertible and diagonalisable, and for some real number .
(i) Show that is nilpotent, i.e. some positive power of is 0.
(ii) Suppose that there is a non-zero vector with and . Determine the diagonal form of .
Paper 4, Section I, F
Part IB, 2012. Let be a complex vector space with basis . Define by for and . Show that is diagonalizable and find its eigenvalues. [You may use any theorems you wish, as long as you state them clearly.]
Paper 2, Section I,
Part IB, 2012. Define the determinant of an real matrix . Suppose that is a matrix with block form
where and are matrices of dimensions and respectively. Show that .
Paper 1, Section I, F
Part IB, 2012. Define the notions of basis and dimension of a vector space. Prove that two finite-dimensional real vector spaces with the same dimension are isomorphic.
In each case below, determine whether the set is a basis of the real vector space
(i) is the complex numbers; .
(ii) is the vector space of all polynomials in with real coefficients;
(iii) , where
Paper 1, Section II, F
Part IB, 2012. Define what it means for two matrices to be similar to each other. Show that if two matrices are similar, then the linear transformations they define have isomorphic kernels and images.
If and are real matrices, we define . Let
Show that and are linear subspaces of . If and are similar, show that and .
Suppose that is diagonalizable and has characteristic polynomial
where . What are and ?
Paper 4, Section II, F
Part IB, 2012. Let be a finite-dimensional real vector space of dimension . A bilinear form is nondegenerate if for all in , there is some with . For , define . Assuming is nondegenerate, show that whenever .
Suppose that is a nondegenerate, symmetric bilinear form on . Prove that there is a basis of with for . [If you use the fact that symmetric matrices are diagonalizable, you must prove it.]
Define the signature of a quadratic form. Explain how to determine the signature of the quadratic form associated to from the basis you constructed above.
A linear subspace is said to be isotropic if for all . Show that if is nondegenerate, the maximal dimension of an isotropic subspace of is , where is the signature of the quadratic form associated to .
Paper 3, Section II, F
Part IB, 2012. What is meant by the Jordan normal form of an complex matrix?
Find the Jordan normal forms of the following matrices:
Suppose is an invertible complex matrix. Explain how to derive the characteristic and minimal polynomials of from the characteristic and minimal polynomials of . Justify your answer. [Hint: write each polynomial as a product of linear factors.]
Paper 2, Section II, F
Part IB, 2012. (i) Define the transpose of a matrix. If and are finite-dimensional real vector spaces, define the dual of a linear map . How are these two notions related?
Now suppose and are finite-dimensional inner product spaces. Use the inner product on to define a linear map and show that it is an isomorphism. Define the adjoint of a linear map . How are the adjoint of and its dual related? If is a matrix representing , under what conditions is the adjoint of represented by the transpose of ?
(ii) Let be the vector space of continuous real-valued functions on , equipped with the inner product
Let be the linear map
What is the adjoint of ?
Paper 4, Section I, E
Part IB, 2013. What is a quadratic form on a finite dimensional real vector space ? What does it mean for two quadratic forms to be isomorphic (i.e. congruent)? State Sylvester's law of inertia and explain the definition of the quantities which appear in it. Find the signature of the quadratic form on given by , where
Paper 2, Section I, E
Part IB, 2013. If is an invertible Hermitian matrix, let
Show that with the operation of matrix multiplication is a group, and that det has norm 1 for any . What is the relation between and the complex Hermitian form defined by ?
If is the identity matrix, show that any element of is diagonalizable.
Paper 1, Section I, E
Part IB, 2013. What is the adjugate of an matrix ? How is it related to ? Suppose all the entries of are integers. Show that all the entries of are integers if and only if .
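The relation referred to here is, in generic notation for an $n\times n$ matrix $A$,
$$A\,\operatorname{adj}(A)=\operatorname{adj}(A)\,A=(\det A)\,I_n,$$
so $A^{-1}=(\det A)^{-1}\operatorname{adj}(A)$ when $\det A\neq 0$; note also that $\operatorname{adj}(A)$ has integer entries whenever $A$ does.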
Paper 1, Section II, E
Part IB, 2013. If and are vector spaces, what is meant by ? If and are subspaces of a vector space , what is meant by ?
Stating clearly any theorems you use, show that if and are subspaces of a finite dimensional vector space , then
Let be subspaces with bases
Find a basis for such that the first component of and the second component of are both 0 .
Paper 4, Section II, E
Part IB, 2013. What does it mean for an matrix to be in Jordan form? Show that if is in Jordan form, there is a sequence of diagonalizable matrices which converges to , in the sense that the th component of converges to the th component of for all and . [Hint: A matrix with distinct eigenvalues is diagonalizable.] Deduce that the same statement holds for all .
Let . Given , define a linear map by . Express the characteristic polynomial of in terms of the trace and determinant of . [Hint: First consider the case where is diagonalizable.]
Paper 3, Section II, E
Part IB, 2013. Let and be finite dimensional real vector spaces and let be a linear map. Define the dual space and the dual map . Show that there is an isomorphism which is canonical, in the sense that for any automorphism of .
Now let be an inner product space. Use the inner product to show that there is an injective map from im to . Deduce that the row rank of a matrix is equal to its column rank.
Paper 2, Section II, E
Part IB, 2013. Define what it means for a set of vectors in a vector space to be linearly dependent. Prove from the definition that any set of vectors in is linearly dependent.
Using this or otherwise, prove that if has a finite basis consisting of elements, then any basis of has exactly elements.
Let be the vector space of bounded continuous functions on . Show that is infinite dimensional.
Paper 4, Section I, G
Part IB, 2014. Let denote the vector space of all real polynomials of degree at most 2. Show that
defines an inner product on .
Find an orthonormal basis for .
Paper 2, Section I, G
Part IB, 2014. State and prove the Rank-Nullity Theorem.
Let be a linear map from to . What are the possible dimensions of the kernel of ? Justify your answer.
Paper 1, Section I, G
Part IB, 2014. State and prove the Steinitz Exchange Lemma. Use it to prove that, in a finite-dimensional vector space: any two bases have the same size, and every linearly independent set extends to a basis.
Let be the standard basis for . Is a basis for ? Is a basis for ? Justify your answers.
Paper 1, Section II, G
Part IB, 2014. Let be an -dimensional real vector space, and let be an endomorphism of . We say that acts on a subspace if .
(i) For any , show that acts on the linear span of .
(ii) If spans , show directly (i.e. without using the CayleyHamilton Theorem) that satisfies its own characteristic equation.
(iii) Suppose that acts on a subspace with and . Let be a basis for , and extend to a basis for . Describe the matrix of with respect to this basis.
(iv) Using (i), (ii) and (iii) and induction, give a proof of the Cayley-Hamilton Theorem.
[Simple properties of determinants may be assumed without proof.]
Paper 4, Section II, G
Part IB, 2014. Let be a real vector space. What is the dual of ? If is a basis for , define the dual basis for , and show that it is indeed a basis for .
[No result about dimensions of dual spaces may be assumed.]
For a subspace of , what is the annihilator of ? If is -dimensional, how does the dimension of the annihilator of relate to the dimension of ?
Let be a linear map between finite-dimensional real vector spaces. What is the dual map ? Explain why the rank of is equal to the rank of . Prove that the kernel of is the annihilator of the image of , and also that the image of is the annihilator of the kernel of .
[Results about the matrices representing a map and its dual may be used without proof, provided they are stated clearly.]
Now let be the vector space of all real polynomials, and define elements of by setting to be the coefficient of in (for each ). Do the form a basis for ?
Paper 3, Section II, G
Part IB, 2014. Let be a nonsingular quadratic form on a finite-dimensional real vector space . Prove that we may write , where the restriction of to is positive definite, the restriction of to is negative definite, and for all and . [No result on diagonalisability may be assumed.]
Show that the dimensions of and are independent of the choice of and . Give an example to show that and are not themselves uniquely defined.
Find such a decomposition when and is the quadratic form
Paper 2, Section II, G
Part IB, 2014. Define the determinant of an complex matrix . Explain, with justification, how the determinant of changes when we perform row and column operations on .
Let be complex matrices. Prove the following statements. (i) . (ii) .
Paper 4, Section I, E
Part IB, 2015. Define the dual space of a vector space . Given a basis of define its dual and show it is a basis of .
Let be a 3-dimensional vector space over and let be the basis of dual to the basis for . Determine, in terms of the , the bases dual to each of the following: (a) , (b) .
Paper 2, Section I,
Part IB, 2015. Let denote a quadratic form on a real vector space . Define the rank and signature of .
Find the rank and signature of the following quadratic forms. (a) . (b) .
(c) .
Paper 1, Section I, E
Part IB, 2015. Let and be finite dimensional vector spaces and a linear map. Suppose is a subspace of . Prove that
where denotes the rank of and denotes the restriction of to . Give examples showing that each inequality can be both a strict inequality and an equality.
Paper 1, Section II, E
Part IB, 2015. Determine the characteristic polynomial of the matrix
For which values of is invertible? When is not invertible determine (i) the Jordan normal form of , (ii) the minimal polynomial of .
Find a basis of such that is the matrix representing the endomorphism in this basis. Give a change of basis matrix such that .
Paper 4, Section II, E
Part IB, 2015. Suppose and are subspaces of a vector space . Explain what is meant by and and show that both of these are subspaces of .
Show that if and are subspaces of a finite dimensional space then
Determine the dimension of the subspace of spanned by the vectors
Write down a matrix which defines a linear map with in the kernel and with image .
What is the dimension of the space spanned by all linear maps
(i) with in the kernel and with image contained in ,
(ii) with in the kernel or with image contained in ?
Paper 3, Section II, E
Part IB, 2015. Let be matrices over a field . We say are simultaneously diagonalisable if there exists an invertible matrix such that is diagonal for all . We say the matrices are commuting if for all .
(i) Suppose are simultaneously diagonalisable. Prove that they are commuting.
(ii) Define an eigenspace of a matrix. Suppose are commuting matrices over a field . Let denote an eigenspace of . Prove that for all .
(iii) Suppose are commuting diagonalisable matrices. Prove that they are simultaneously diagonalisable.
(iv) Are the diagonalisable matrices over simultaneously diagonalisable? Explain your answer.
Paper 2, Section II, E
Part IB, 2015. (i) Suppose is a matrix that does not have as an eigenvalue. Show that is non-singular. Further, show that commutes with .
(ii) A matrix is called skew-symmetric if . Show that a real skew-symmetric matrix does not have as an eigenvalue.
(iii) Suppose is a real skew-symmetric matrix. Show that is orthogonal with determinant 1.
(iv) Verify that every orthogonal matrix with determinant 1 which does not have as an eigenvalue can be expressed as where is a real skew-symmetric matrix.
Paper 4, Section I, F
Part IB, 2016. For which real numbers do the vectors
not form a basis of ? For each such value of , what is the dimension of the subspace of that they span? For each such value of , provide a basis for the spanned subspace, and extend this basis to a basis of .
Paper 2, Section I, F
Part IB, 2016. Find a linear change of coordinates such that the quadratic form
takes the form
for real numbers and .
Paper 1, Section ,
Part IB, 2016. (a) Consider the linear transformation given by the matrix
Find a basis of in which is represented by a diagonal matrix.
(b) Give a list of matrices such that any linear transformation with characteristic polynomial
and minimal polynomial
is similar to one of the matrices on your list. No two distinct matrices on your list should be similar. [No proof is required.]
Paper 1, Section II, F
Part IB, 2016. Let denote the vector space over or of matrices with entries in . Let denote the trace functional, i.e., if , then
(a) Show that Tr is a linear functional.
(b) Show that for .
(c) Show that is unique in the following sense: If is a linear functional such that for each , then is a scalar multiple of the trace functional. If, in addition, , then Tr.
(d) Let be the subspace spanned by matrices of the form for . Show that is the kernel of Tr.
Paper 4, Section II, F
Part IB, 2016. (a) Let be a linear transformation between finite dimensional vector spaces over a field or .
Define the dual map of . Let be the dual map of . Given a subspace , define the annihilator of . Show that and the image of coincide. Conclude that the dimension of the image of is equal to the dimension of the image of . Show that .
(b) Now suppose in addition that are inner product spaces. Define the adjoint of . Let be linear transformations between finite dimensional inner product spaces. Suppose that the image of is equal to the kernel of . Then show that is an isomorphism.
Paper 3, Section II, F
Part IB, 2016. Let be a linear transformation defined on a finite dimensional inner product space over . Recall that is normal if and its adjoint commute. Show that being normal is equivalent to each of the following statements:
(i) where are self-adjoint operators and ;
(ii) there is an orthonormal basis for consisting of eigenvectors of ;
(iii) there is a polynomial with complex coefficients such that .
Paper 2, Section II, F
Part IB, 2016. Let denote the vector space over a field or of matrices with entries in . Given , consider the two linear transformations defined by
(a) Show that .
[For parts (b) and (c), you may assume the analogous result without proof.]
(b) Now let . For , write for the conjugate transpose of , i.e., . For , define the linear transformation by
Show that .
(c) Again let . Let be the set of Hermitian matrices. [Note that is not a vector space over but only over For and , define . Show that is an -linear operator on , and show that as such,
Paper 2, Section I, F
Part IB, 2017. State and prove the Rank-Nullity theorem.
Let be a linear map from to of rank 2 . Give an example to show that may be the direct sum of the kernel of and the image of , and also an example where this is not the case.
Paper 1, Section I, F
Part IB, 2017. State and prove the Steinitz Exchange Lemma.
Deduce that, for a subset of , any two of the following imply the third:
(i) is linearly independent
(ii) is spanning
(iii) has exactly elements
Let be a basis of . For which values of do form a basis of ?
Paper 4, Section I, F
Part IB, 2017. Briefly explain the Gram-Schmidt orthogonalisation process in a real finite-dimensional inner product space .
For a subspace of , define , and show that .
For which positive integers does
define an inner product on the space of all real polynomials of degree at most ?
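For the orthogonal-complement part, the standard definition and dimension fact, in generic notation for a subspace $W$ of a finite-dimensional inner product space $V$:
$$W^{\perp}=\{v\in V:\langle v,w\rangle=0\ \text{for all } w\in W\},\qquad V=W\oplus W^{\perp},\ \text{so } \dim W+\dim W^{\perp}=\dim V.$$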
Paper 1, Section II, F
Part IB, 2017. Let and be finite-dimensional real vector spaces, and let be a surjective linear map. Which of the following are always true and which can be false? Give proofs or counterexamples as appropriate.
(i) There is a linear map such that is the identity map on .
(ii) There is a linear map such that is the identity map on .
(iii) There is a subspace of such that the restriction of to is an isomorphism from to .
(iv) If and are subspaces of with then .
(v) If and are subspaces of with then .
Paper 2, Section II, F
Part IB, 2017. Let and be linear maps between finite-dimensional real vector spaces.
Show that the rank satisfies . Show also that . For each of these two inequalities, give examples to show that we may or may not have equality.
Now let have dimension and let be a linear map of rank such that . Find the rank of for each .
Paper 4, Section II, F
Part IB, 2017. What is the dual of a finite-dimensional real vector space ? If has a basis , define the dual basis, and prove that it is indeed a basis of .
[No results on the dimension of duals may be assumed without proof.]
Write down (without making a choice of basis) an isomorphism from to . Prove that your map is indeed an isomorphism.
Does every basis of arise as the dual basis of some basis of ? Justify your answer.
A subspace of is called separating if for every non-zero there is a with . Show that the only separating subspace of is itself.
Now let be the (infinite-dimensional) space of all real polynomials. Explain briefly how we may identify with the space of all real sequences. Give an example of a proper subspace of that is separating.
Paper 3, Section II, F
Part IB, 2017. Let be a quadratic form on a finite-dimensional real vector space . Prove that there exists a diagonal basis for , meaning a basis with respect to which the matrix of is diagonal.
Define the rank and signature of in terms of this matrix. Prove that and are independent of the choice of diagonal basis.
In terms of , and the dimension of , what is the greatest dimension of a subspace on which is zero?
Now let be the quadratic form on given by . For which points in is it the case that there is some diagonal basis for containing ?
Paper 1, Section I, E
Part IB, 2018. State the Rank-Nullity Theorem.
If and are linear maps and is finite dimensional, show that
If is another linear map, show that
Paper 2, Section I, E
Part IB, 2018. Let be a real vector space. Define the dual vector space of . If is a subspace of , define the annihilator of . If is a basis for , define its dual and prove that it is a basis for .
If has basis and is the subspace spanned by
give a basis for in terms of the dual basis .
Paper 4, Section I, E
Part IB, 2018. Define a quadratic form on a finite dimensional real vector space. What does it mean for a quadratic form to be positive definite?
Find a basis with respect to which the quadratic form
is diagonal. Is this quadratic form positive definite?
Paper 1, Section II, E
Part IB, 2018. Define a Jordan block . What does it mean for a complex matrix to be in Jordan normal form?
If is a matrix in Jordan normal form for an endomorphism , prove that
is the number of Jordan blocks of with .
Find a matrix in Jordan normal form for . [Consider all possible values of .]
Find a matrix in Jordan normal form for the complex matrix
assuming it is invertible.
Paper 2, Section II, E
Part IB, 2018. If is an matrix over a field, show that there are invertible matrices and such that
for some , where is the identity matrix of dimension .
For a square matrix of the form with and square matrices, prove that .
If and have no common eigenvalue, show that the linear map
is injective.
Paper 4, Section II, E
Part IB, 2018. Let be a finite dimensional inner-product space over . What does it mean to say that an endomorphism of is self-adjoint? Prove that a self-adjoint endomorphism has real eigenvalues and may be diagonalised.
An endomorphism is called positive definite if it is self-adjoint and satisfies for all non-zero ; it is called negative definite if is positive definite. Characterise the property of being positive definite in terms of eigenvalues, and show that the sum of two positive definite endomorphisms is positive definite.
Show that a self-adjoint endomorphism has all eigenvalues in the interval if and only if is positive definite for all and negative definite for all .
Let be self-adjoint endomorphisms whose eigenvalues lie in the intervals and respectively. Show that all of the eigenvalues of lie in the interval .
Paper 3, Section II, E
Part IB, 2018. State and prove the Cayley-Hamilton Theorem.
Let be an complex matrix. Using division of polynomials, show that if is a polynomial then there is another polynomial of degree at most such that for each eigenvalue of and such that .
Hence compute the entry of the matrix when
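For reference, the two standard facts this question combines, in generic notation for an $n\times n$ matrix $A$ with characteristic polynomial $\chi_A$: the Cayley–Hamilton theorem states $\chi_A(A)=0$, and dividing a polynomial $p$ by $\chi_A$ as $p=q\chi_A+r$ with $\deg r\le n-1$ gives
$$p(A)=q(A)\chi_A(A)+r(A)=r(A),$$
while $p(\lambda)=r(\lambda)$ at every eigenvalue $\lambda$, since $\chi_A(\lambda)=0$.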
Paper 4, Section I, F
Part IB, 2019. What is an eigenvalue of a matrix ? What is the eigenspace corresponding to an eigenvalue of ?
Consider the matrix
for a non-zero vector. Show that has rank 1. Find the eigenvalues of and describe the corresponding eigenspaces. Is diagonalisable?
Paper 2, Section I, F
Part IB, 2019. If and are finite-dimensional subspaces of a vector space , prove that
Let
Show that is 3-dimensional and find a linear map such that
Paper 1, Section I, F
Part IB, 2019. Define a basis of a vector space .
If has a finite basis , show using only the definition that any other basis has the same cardinality as .
Paper 1, Section II, F
Part IB, 2019. What is the adjugate adj of an matrix ? How is it related to ?
(a) Define matrices by
and scalars by
Find a recursion for the matrices in terms of and the 's.
(b) By considering the partial derivatives of the multivariable polynomial
show that
(c) Hence show that the 's may be expressed in terms of .
Paper 4, Section II, F
Part IB, 2019. If is a finite-dimensional real vector space with inner product , prove that the linear map given by is an isomorphism. [You do not need to show that it is linear.]
If and are inner product spaces and is a linear map, what is meant by the adjoint of ? If is an orthonormal basis for is an orthonormal basis for , and is the matrix representing in these bases, derive a formula for the matrix representing in these bases.
Prove that .
If then the linear equation has no solution, but we may instead search for a minimising , known as a least-squares solution. Show that is such a least-squares solution if and only if it satisfies . Hence find a least-squares solution to the linear equation
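One standard characterisation of least-squares solutions, in generic notation for a real matrix $A$ and target vector $b$ (the symbols are assumed, not the paper's): $\hat{x}$ minimises $\|A\hat{x}-b\|$ if and only if it satisfies the normal equations
$$A^{\mathsf{T}}A\hat{x}=A^{\mathsf{T}}b,$$
equivalently $A^{\mathsf{T}}(b-A\hat{x})=0$, i.e. the residual is orthogonal to the column space of $A$.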
Paper 3, Section II, F
Part IB, 2019. If is a quadratic form on a finite-dimensional real vector space , what is the associated symmetric bilinear form ? Prove that there is a basis for with respect to which the matrix for is diagonal. What is the signature of ?
If is a subspace such that for all and all , show that defines a quadratic form on the quotient vector space . Show that the signature of is the same as that of .
If are vectors such that and , show that there is a direct sum decomposition such that the signature of is the same as that of .
Paper 2, Section II, F
Part IB, 2019. Let and be matrices over .
(a) Assuming that is invertible, show that and have the same characteristic polynomial.
(b) By considering the matrices , show that and have the same characteristic polynomial even when is singular.
(c) Give an example to show that the minimal polynomials and of and may be different.
(d) Show that and differ at most by a factor of . Stating carefully any results which you use, deduce that if is diagonalisable then so is .
Paper 1, Section I, F
Part IB, 2020. Define what it means for two matrices and to be similar. Define the Jordan normal form of a matrix.
Determine whether the matrices
are similar, carefully stating any theorem you use.
Paper 1, Section II, F
Part IB, 2020. Let denote the vector space of matrices over a field or . What is the of a matrix ?
Show, stating accurately any preliminary results that you require, that if and only if is non-singular, i.e. .
Does have a basis consisting of non-singular matrices? Justify your answer.
Suppose that an matrix is non-singular and every entry of is either 0 or 1. Let be the largest possible number of 1's in such an . Show that . Is this bound attained? Justify your answer.
[Standard properties of the adjugate matrix can be assumed, if accurately stated.]
Paper 2, Section II, F
Part IB, 2020. Let be a finite-dimensional vector space over a field. Show that an endomorphism of is idempotent, i.e. , if and only if is a projection onto its image.
Determine whether the following statements are true or false, giving a proof or counterexample as appropriate:
(i) If , then is idempotent.
(ii) The condition is equivalent to being idempotent.
(iii) If and are idempotent and such that is also idempotent, then .
(iv) If and are idempotent and , then is also idempotent.
Paper 1, Section I,
Part IB, 2021. Let be a vector space over , and let , symmetric bilinear form on .
Let . Show that is of dimension and . Show that if is a subspace with , then the restriction of , is nondegenerate.
Conclude that the dimension of is even.
Paper 4, Section I,
Part IB, 2021. Let be the vector space of by complex matrices.
Given , define the linear ,
(i) Compute a basis of eigenvectors, and their associated eigenvalues, when is the diagonal matrix
What is the rank of ?
(ii) Now let . Write down the matrix of the linear transformation with respect to the standard basis of .
What is its Jordan normal form?
Paper 1, Section II, E
Part IB, 2021. Let , and let .
(a) (i) Compute , for all .
(ii) Hence, or otherwise, compute , for all .
(b) Let be a finite-dimensional vector space over , and let . Suppose for some .
(i) Determine the possible eigenvalues of .
(ii) What are the possible Jordan blocks of ?
(iii) Show that if , there exists a decomposition
where , and .
Paper 2, Section II, E
Part IB, 2021. (a) Compute the characteristic polynomial and minimal polynomial of
Write down the Jordan normal form for .
(b) Let be a finite-dimensional vector space over be a linear map, and for , write
(i) Given , construct a non-zero eigenvector for in terms of .
(ii) Show that if are non-zero eigenvectors for with eigenvalues , and for all , then are linearly independent.
(iii) Show that if are all non-zero, and for all , then are linearly independent.
Paper 3, Section II, 9E
Part IB, 2021. (a) (i) State the rank-nullity theorem.
Let and be vector spaces. Write down the definition of their direct sum and the inclusions .
Now let and be subspaces of a vector space . Define by
Describe the quotient space as a subspace of .
(ii) Let , and let be the subspace of spanned by the vectors
and the subspace of spanned by the vectors
Determine the dimension of .
(b) Let be complex by matrices with .
Show that is a polynomial in of degree at most .
Show that if the polynomial is of degree precisely .
Give an example where but this polynomial is zero.
Paper 4, Section II, E
Part IB, 2021. (a) Let be a complex vector space of dimension .
What is a Hermitian form on ?
Given a Hermitian form, define the matrix of the form with respect to the basis of , and describe in terms of the value of the Hermitian form on two elements of .
Now let be another basis of . Suppose , and let . Write down the matrix of the form with respect to this new basis in terms of and .
Let . Describe the dimension of in terms of the matrix .
(b) Write down the matrix of the real quadratic form
Using the Gram-Schmidt algorithm, find a basis which diagonalises the form. What are its rank and signature?
(c) Let be a real vector space, and , be the matrix of this form in some basis.
Prove that the signature of the form is the number of positive eigenvalues of the matrix minus the number of negative eigenvalues.
Explain, using an example, why the eigenvalues themselves depend on the choice of a basis.